Web Survey Bibliography
Title Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance for data quality
Author Wetzlehuetter, D.
Year 2017
Access date 15.09.2017
Abstract Starting point and focus: It is impossible to ignore the internet as a quick, practicable and economical source of information and a nearly unlimited communication channel: a mass medium (online news), a mainstream medium (social media) and an individual medium (email). The number of web surveys, and of methods for conducting them, has grown alongside the use of the internet. For instance, the Arbeitskreis Deutscher Markt- und Sozialforschungsinstitute e.V. recorded a continuous increase in the share of quantitative web surveys among its members, from 1% in 1998 to 16% in 2004, 38% in 2010 and 43% in 2014. However, web-based surveys, as extensive discussions show, are not free of controversy. Questionable data quality, typically regarding the representativeness of the data (coverage error / missing data), and difficulties in achieving unbiased responses (measurement errors) caused by the equipment used (mode effects), are increasingly common. Errors caused by continuously rising proportions of drop-outs and item nonresponses in online surveys are relevant in almost the same manner, yet these sources of error are repeatedly neglected to a certain degree.
As the starting point of the paper, it is assumed that drop-out rates and item-nonresponse rates in online surveys vary with context-sensitive response behaviour (whether respondents are at home or not, and whether they use a smartphone or not). This means that systematic errors linked to the interview situation (in terms of location and device) are conceivable. Accordingly, the presentation aims to illustrate how, and to what extent, the context of the interview situation has to be considered in the cleansing and analysis of data captured online in order to avoid biased results as far as possible.
Methods and Data: To test this assumption, an online survey on the “participation of university students” is used. An experimental design was applied, both to provoke drop-outs and to test the consequences of different motivation strategies (prospect of a prize, appeals, manipulation of the progress bar) that are easy to insert and are therefore often used in online surveys. For this purpose, an unusually long questionnaire (23 online pages, 121 items) was developed, into which the different motivation strategies were incorporated. Of the 17,491 students invited to take part in the survey, 14.2% reacted to the invitation, 1,916 (11%) answered at least one question, and just 7.3% (n=1,282) reached the final page.
Results: Drop-out rates and item-nonresponse rates differ depending on the survey context specified above: not being at home and using a smartphone increase both. The motivation strategies used work differently: they reduce the risk of nonresponse only for those who were at home and did not use a smartphone. However, data cleansing does not affect the sample composition with respect to study-related characteristics. Detailed analyses show that the influence of the defined survey context on substantive findings varies. Based on this, the presentation will emphasise the importance of recording and considering the context information of data collection for the cleansing, analysis and interpretation of results and will discuss how this
Access/Direct link Conference Homepage (abstract) / (presentation)
Year of publication 2017
Bibliographic type Conferences, workshops, tutorials, presentations
Web survey bibliography (1211)
- Using experts’ consensus (the Delphi method) to evaluate weighting techniques in web surveys not...; 2017; Toepoel, V.; Emerson, H.
- Mind the Mode: Differences in Paper vs. Web-Based Survey Modes Among Women With Cancer; 2017; Hagan, T. L.; Belcher, S. M.; Donovan, H. S.
- Ideal and maximum length for a web survey; 2017; Revilla, M.; Ochoa, C.
- Effects of sampling procedure on data quality in a web survey; 2017; Rimac, I.; Ogresta, J.
- Fieldwork monitoring and managing with time-related paradata; 2017; Vandenplas, C.
- Interviewer effects on onliner and offliner participation in the German Internet Panel; 2017; Herzing, J. M. E.; Blom, A. G.; Meuleman, B.
- Nonresponses as context-sensitive response behaviour of participants in online-surveys and their relevance...; 2017; Wetzlehuetter, D.
- Push2web or less is more? Experimental evidence from a mixed-mode population survey at the community...; 2017; Neumann, R.; Haeder, M.; Brust, O.; Dittrich, E.; von Hermanni, H.
- The perils of non-probability sampling; 2017; Bethlehem, J.
- Targeted letters: Effects on sample composition and item non-response; 2017; Bianchi, A.; Biffignandi, S.
- Web Health Monitoring Survey: A New Approach to Enhance the Effectiveness of Telemedicine Systems; 2017; Romano, M. F.; Sardella, M. V.; Alboni, F.
- Device and Internet Use among Spanish-dominant Hispanics: Implications for Web Survey Design and Testing...; 2017; Trejo, Y. A. G.; Schoua-Glusberg, A.
- Data collection mode differences between national face-to-face and web surveys on gender inequality...; 2017; Liu, M.
- A test of sample matching using a pseudo-web sample; 2017; Chatrchi, G.; Gambino, J.
- PC, phone or tablet? Use, preference and completion rates for web surveys ; 2017; Brosnan, K.; Gruen, B.; Dolnicar, S.
- Web survey experiments on matrix questions; 2017; Liu, M.
- Overview: Online Surveys; 2017; Vehovar, V.; Lozar Manfreda, K.
- Respondent mode choice in a smartphone survey; 2017; Conrad, F. G.; Schober, M. F.; Antoun, C.; Yan, H. Y.; Hupp, A.; Johnston, M.; Ehlen, P.; Vickers, L...
- Methodological Aspects of Central Left-Right Scale Placement in a Cross-national Perspective; 2016; Scholz, E.; Zuell, C.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys 2016; 2016
- Using Visual Analogue Scales in eHealth: Non-Response Effects in a Lifestyle Intervention; 2016; Kuhlmann, T.; Reips, U.-D.; Wienert, J.; Lippke, S.
- Are Initial Respondents Different from the Nonresponse Follow-Up Cases? A Study of Probability-Based...; 2016; Zeng, W.; Dennis, J. M.
- Predicting and Preventing Break-Offs in Web Surveys; 2016; Mittereder, F.
- Design of Sample Surveys That Complement Observational Data to Achieve Population Coverage; 2016; Slud, E.; Ashmead, R.
- Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys; 2016
- Du kommst hier nicht rein: Türsteherfragen identifizieren nachlässige Teilnehmer in Online-Umfragen; 2016; Merkle, B.; Kaczmirek, L.; Hellwig, O.
- Population Survey Features and Response Rates: A Randomized Experiment; 2016; Guo, Y.; Kopec, J.; Cibere, J.; Li, L. C.; Goldsmith, C. H.
- Mode Effect and Response Rate Issues in Mixed-Mode Survey Research: Implications for Recreational Fisheries...; 2016; Wallen, K. E.; Landon, A. C.; Kyle, G. T.; Schuett, M. A.; Leitz, J.; Kurzawski, K.
- Geht’s auch mit der Maus? – Eine Methodenstudie zu Online-Befragungen in der Jugendforschung...; 2016; Heim, R.; Konowalczyk, S.; Grgic, M.; Seyda, M.; Burrmann, U.; Rauschenbach, T.
- FocusVision 2015 Annual MR Technology Report; 2016; Macer, T.; Wilson, S.
- Can Student Populations in Developing Countries Be Reached by Online Surveys? The Case of the National...; 2016; Langer, A.; Meuleman, B.; Oshodi, A.-G. T.; Schroyens, M.
- Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?; 2016; Meitinger, K.; Behr, D.
- The impact of survey duration on completion rates among Millennial respondents ; 2016; Coates, D.; Bliss, M.; Vivar, X.
- How to maximize survey response rates ; 2016; DeVall, R.; Colby, C.
- Participation rates of childhood cancer survivors to self-administered questionnaires: a systematic...; 2016; Kilsdonk, E.; Wendel, E.; van Dulmen-den Broeder, E.; van Leeuwen, F.E.; Van Den Berg, M. H.; Jaspers...
- The use of online social networks as a promotional tool for self-administered internet surveys; 2016; de Rada, V. D.; Arino, L. V. C; Blasco, M. G
- The Effects of Pictorial vs. Verbal Examples on Survey Responses ; 2016; Sun, H.; Bertling, J.; Almonte, D.
- Grids and Online Surveys: Do More Complex Grids Induce Survey Satisficing? Evidence from the Gallup...; 2016; Wang, Me.; McCutcheon, A. L.
- The Effect of Emphasizing the Web Option in a Mixed-mode Establishment Survey ; 2016; O'Brien, J.; Rajapaksa, S.; Schafer, B.; Langetieg, P.
- Effect of Clarifying Instructions on Response to Numerical Open-ended Questions in Self-administered...; 2016; Kumar Chaudhary, A.; Israel, G. D.
- Beyond the Survey: Improving Data Insights and User Experience with Mobile Devices ; 2016; Graham, P.; Lew, G.
- User Experience Considerations for Contextual Product Surveys on Smartphones ; 2016; Sedley, A.; Mueller, H.
- The Differential Effect of Mobile-friendly Surveys on Data Quality; 2016; Horwitz, R.
- Assessing Changes in Coverage Bias of Web Surveys as Internet Access Increases in the United States...; 2016; Sterrett, D.; Malato, D.; Benz, J.; Tompson, T.; English, N.
- Timing is Everything: Discretely Discouraging Mobile Survey Response through the Timing of Email Contacts...; 2016; Richards, A. C.; Shook-Sa, B. E.; Berzofsky, M.; Smith, A. C.
- Patterns of Unit and Item Nonresponse in a Multinational Web Survey ; 2016; Ackermann, A.; Howard Ecklund, E.; Phillips, B. T.; Brulia, A.
- A Closer Look at Response Time Outliers in Online Surveys Using Paradata Survey Focus; 2016; Schlosser, S.; Hoehne, J. K.
- Response Order Effects on a Web Survey of Nurse Practitioners; 2016; Quintana, G.; Riley, L. E.
- Assessing the Effects and Effectiveness of Attention-check Questions in Web Surveys: Evidence From a...; 2016; Vannette, D.
- Effects of an Initial Offering of Multiple Survey Response Options on Response Rates; 2016; Steele, E. A.; Marlar, J.; Allen, L.; Kanitkar, K. N.